The Information Criterion

Authors

Abstract


Similar Articles

The Entropy Regularization Information Criterion

Effective methods of capacity control via uniform convergence bounds for function expansions have been largely limited to Support Vector machines, where good bounds are obtainable by the entropy number approach. We extend these methods to systems with expansions in terms of arbitrary (parametrized) basis functions and a wide range of regularization methods covering the whole range of general li...


Kernel-Based Information Criterion

This paper introduces Kernel-based Information Criterion (KIC) for model selection in regression analysis. The kernel-based complexity measure in KIC efficiently computes the interdependency between parameters of the model using a novel variable-wise variance and yields selection of better, more robust regressors. Experimental results show superior performance on both simulated and real data se...



Exponential Smoothing and the Akaike Information Criterion

Using an innovations state space approach, it has been found that the Akaike information criterion (AIC) works slightly better, on average, than prediction validation on withheld data, for choosing between the various common methods of exponential smoothing for forecasting. There is, however, a puzzle. Should the count of the seed states be incorporated into the penalty term in the AIC formula?...
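For context, and as a standard reference point rather than a quotation from the abstract above, the penalty term in question is the 2k term of the usual AIC formula

AIC = −2 ln(L̂) + 2k,

where L̂ is the maximized likelihood of the fitted model and k counts its freely estimated parameters; the puzzle raised is whether the seed (initial) states of the innovations state space model should be counted in k.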


Improving Precision of the Subspace Information Criterion

Evaluating the generalization performance of learning machines without using additional test samples is one of the most important issues in the machine learning community. The subspace information criterion (SIC) is one of the methods for this purpose, which is shown to be an unbiased estimator of the generalization error with finite samples. Although the mean of SIC agrees with the true genera...
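As a point of reference (an assumption about the standard SIC setting, not stated in the truncated abstract), unbiasedness here means that, averaged over the noise in the training samples, SIC agrees in expectation with the squared-error generalization measure between the learned function f̂ and the true function f:

E[SIC] = E[ ‖f̂ − f‖² ].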



Journal

Journal title: Journal of Modern Applied Statistical Methods

Year: 2014

ISSN: 1538-9472

DOI: 10.22237/jmasm/1414815840